
Homogeneous Systems of Linear Constant-Coefficient Differential Equations

Introduction

A system of linear constant-coefficient differential equations can be written in matrix form as

$$X' = AX + F,$$

where $X\in\mathbb{R}^n$ is an unknown vector-valued function, $A\in\mathbb{R}^{n\times n}$ is a constant coefficient matrix, and $F\in\mathbb{R}^n$ is the forcing term. When $F=0$, we have a homogeneous system. In the homogeneous case, $X\equiv 0$ is always a solution and the set of all solutions forms an $n$-dimensional vector space.

As in the one-dimensional case, linearity leads to superposition, and specifying an initial condition $X(t_0)=X_0$ determines a unique solution.

A key feature of the system $X'=AX$ is that all solution behavior is encoded in the matrix $A$. The eigenvalues of $A$ determine the behavior of the system, and the eigenvectors (or generalized eigenvectors) determine the directions in which these behaviors occur.

Motivation: One-Dimensional Constant-Coefficient Linear ODEs

Recall that for a one-dimensional constant-coefficient linear ODE, we typically seek an exponential trial solution of the form $e^{rt}$. For example, for the second-order homogeneous equation

$$x'' + bx' + cx = 0,$$

trying $x=e^{rt}$ leads to the characteristic equation

$$r^2 + br + c = 0,$$

which produces three standard cases:

  1. **Distinct real roots $r_1\neq r_2$:** $x(t)=C_1e^{r_1 t}+C_2e^{r_2 t}$.
  2. **Repeated real root $r_1=r_2$:** $x(t)=(C_1+C_2 t)e^{rt}$.
  3. **Complex roots $r=\alpha\pm i\beta$:** $x(t)=e^{\alpha t}\big(C_1\cos(\beta t)+C_2\sin(\beta t)\big)$.

Generalization to $n$ Dimensions: Eigenvalues and Eigenvectors

For the homogeneous system

$$X' = AX,$$

we use the analogous exponential ansatz

$$X(t) = e^{\lambda t}v,$$

where $v$ is a nonzero constant vector. Differentiating gives

$$X'(t) = \lambda v\, e^{\lambda t}.$$

Substituting into $X'=AX$ and cancelling the always nonzero factor $e^{\lambda t}$ yields

$$\lambda v = Av \quad\Longleftrightarrow\quad (A-\lambda I)v = 0,$$

so nontrivial solutions exist only when

$$\det(A-\lambda I) = 0.$$

You may recall that this is precisely the characteristic equation of the matrix $A$. Solving it for $\lambda$ yields the eigenvalues of $A$.

For each eigenvalue-eigenvector pair $(\lambda, v)$, we obtain a solution $X(t)=e^{\lambda t}v$. The eigenvector $v$ specifies the direction in space along which the exponential behavior $e^{\lambda t}$ occurs.
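As a quick numerical sanity check of this ansatz, here is a short self-contained Python sketch (the particular matrix `A` and the helper names `eigenvector` and `matvec` are our own illustration, not part of the theory). It finds the eigenvalues of a $2\times 2$ matrix from its characteristic quadratic $\lambda^2 - \operatorname{tr}(A)\,\lambda + \det A = 0$ and then verifies $Av = \lambda v$ for each pair:

```python
import math

# Illustrative 2x2 example: eigenvalues come from the characteristic
# quadratic  lambda^2 - tr(A)*lambda + det(A) = 0.
A = [[3.0, 1.0],
     [0.0, 2.0]]

tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = tr * tr - 4 * det  # assumed positive here: distinct real eigenvalues

lam1 = (tr + math.sqrt(disc)) / 2
lam2 = (tr - math.sqrt(disc)) / 2

def eigenvector(A, lam):
    """Pick a nonzero solution of (A - lam I)v = 0 from a nonzero row."""
    a, b = A[0][0] - lam, A[0][1]
    if abs(a) > 1e-12 or abs(b) > 1e-12:
        return [b, -a]   # satisfies a*b + b*(-a) = 0; other row is dependent
    c, d = A[1][0], A[1][1] - lam
    return [d, -c]

def matvec(A, v):
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

for lam in (lam1, lam2):
    v = eigenvector(A, lam)
    Av = matvec(A, v)
    assert all(abs(Av[i] - lam * v[i]) < 1e-9 for i in range(2))
```

For this upper-triangular choice of `A`, the computed eigenvalues are just the diagonal entries, as expected.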

As with scalar equations, there are three important cases for systems. In each case, the solution is built from eigenvalues and (generalized) eigenvectors, in direct analogy with characteristic roots and repeated roots in the one-dimensional setting.

Distinct Real Eigenvalues

Assume $A\in\mathbb{R}^{n\times n}$ has $n$ distinct real eigenvalues $\lambda_1,\dots,\lambda_n$ with corresponding eigenvectors $v_1,\dots,v_n$. Then the $n$ vector functions

$$X_i(t) = e^{\lambda_i t}v_i, \qquad i=1,\dots,n$$

are linearly independent solutions of $X'=AX$. The general solution is the linear combination

$$X(t) = c_1 e^{\lambda_1 t}v_1 + \cdots + c_n e^{\lambda_n t}v_n,$$

where the constants $c_1,\dots,c_n$ are determined by an initial condition $X(0)=X_0$.

Repeated Eigenvalues (Generalized Eigenvectors)

Suppose $\lambda$ is an eigenvalue of $A$ with algebraic multiplicity $m>1$. There are two cases for the eigenvectors:

  1. **$m$ eigenvectors:** $\lambda$ has $m$ linearly independent eigenvectors.
  2. **Fewer than $m$ eigenvectors:** $\lambda$ has only $k$ linearly independent eigenvectors, where $k<m$.

Recall that this eigenvalue must contribute $m$ linearly independent solutions to our basis of the solution space. We first count how many eigenvectors we have, and then use generalized eigenvectors to make up the difference.

To find eigenvectors for $\lambda$, recall that we solve the homogeneous linear system

$$(A-\lambda I)v = 0.$$

Row-reducing $A-\lambda I$ to RREF lets us count the eigenvectors quickly: if the RREF has $p$ pivots, then there are $n-p$ free variables, so we get $n-p$ linearly independent eigenvectors for $\lambda$.

If $n-p < m$, the eigenvectors alone do not give us a complete basis of solutions. To complete the basis for our solution space, we must find generalized eigenvectors.

Let $v$ be an eigenvector for $\lambda$, so $(A-\lambda I)v=0$. A generalized eigenvector $w$ is a vector that solves the linear system

$$(A-\lambda I)w = v.$$

Once we have $v$ and $w$, we obtain two independent solutions

$$X_1(t) = e^{\lambda t}v, \qquad X_2(t) = e^{\lambda t}(tv + w).$$

Then the general solution contributed by this repeated eigenvalue is

$$X(t) = c_1 X_1(t) + c_2 X_2(t) = c_1 e^{\lambda t}v + c_2 e^{\lambda t}(tv + w).$$

If we still need more solutions, we can continue the process by solving $(A-\lambda I)u = w$, and so on. Each new generalized eigenvector produces another solution with a higher power of $t$ multiplying $e^{\lambda t}$.

In general, if $\lambda$ has algebraic multiplicity $m$ and we can build generalized eigenvectors $v_1, v_2, \dots, v_m$ with the recurrence relation

$$(A-\lambda I)v_1 = 0, \qquad (A-\lambda I)v_i = v_{i-1} \quad (i=2,\dots,m),$$

then we obtain $m$ independent solutions

$$X_i(t) = e^{\lambda t}\left(\frac{t^{i-1}}{(i-1)!}v_1 + \frac{t^{i-2}}{(i-2)!}v_2 + \cdots + t\, v_{i-1} + v_i\right), \qquad i=1,\dots,m.$$

We can think of this as the system analogue of the repeated-root solutions $(C_1 + C_2 t + \cdots)e^{rt}$ from the scalar case.
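To see the chain formula in action, here is a minimal Python sketch on an assumed example: a $3\times 3$ Jordan block with $\lambda = 2$ and the standard basis vectors as the chain $v_1, v_2, v_3$. It verifies numerically that the deepest solution $X_3(t)=e^{\lambda t}\left(\tfrac{t^2}{2}v_1 + t v_2 + v_3\right)$ really satisfies $X'=AX$:

```python
import math

# Assumed example: a 3x3 Jordan block with eigenvalue lam = 2.
# Chain: (A - lam I)v1 = 0, (A - lam I)v2 = v1, (A - lam I)v3 = v2.
lam = 2.0
A = [[lam, 1.0, 0.0],
     [0.0, lam, 1.0],
     [0.0, 0.0, lam]]
v1 = [1.0, 0.0, 0.0]
v2 = [0.0, 1.0, 0.0]
v3 = [0.0, 0.0, 1.0]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

def X3(t):
    # X3(t) = e^{lam t} (t^2/2 * v1 + t * v2 + v3)
    e = math.exp(lam * t)
    return [e * (t*t/2 * v1[i] + t * v2[i] + v3[i]) for i in range(3)]

def X3_prime(t):
    # Product rule: lam * X3(t) + e^{lam t} (t * v1 + v2)
    e = math.exp(lam * t)
    x = X3(t)
    return [lam * x[i] + e * (t * v1[i] + v2[i]) for i in range(3)]

t = 0.7
lhs = X3_prime(t)
rhs = matvec(A, X3(t))
assert all(abs(lhs[i] - rhs[i]) < 1e-9 for i in range(3))
```

The same check works for $X_1$ and $X_2$; differentiating any chain solution shifts each $v_i$ term down to $v_{i-1}$, which is exactly what multiplying by $A$ does.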

Complex Eigenvalues

For real matrices $A$, nonreal eigenvalues occur in conjugate pairs $\lambda = \alpha \pm i\beta$ with $\beta\neq 0$. If $v = p + iq$ is an eigenvector for $\alpha + i\beta$ (with $p, q\in\mathbb{R}^n$), then the complex solution

$$X(t) = e^{(\alpha+i\beta)t}(p+iq)$$

has real and imaginary parts that are themselves real solutions. Using Euler’s formula, we obtain the real solution pair

$$X_1(t) = e^{\alpha t}\big(p\cos(\beta t) - q\sin(\beta t)\big), \qquad X_2(t) = e^{\alpha t}\big(p\sin(\beta t) + q\cos(\beta t)\big).$$

This same construction generalizes directly to $n$ dimensions: each complex conjugate eigenvalue pair produces two linearly independent real solutions obtained from the real and imaginary parts of the corresponding complex solution.

If $\alpha \pm i\beta$ is a single complex conjugate pair, then the general (real) solution contributed by this pair is

$$X(t) = c_1 X_1(t) + c_2 X_2(t),$$

with real constants $c_1, c_2$.
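The Euler's-formula step can be checked directly in Python. This sketch uses assumed values $\alpha=1$, $\beta=2$, $p=(1,0)$, $q=(0,1)$ (purely for illustration) and confirms that the real and imaginary parts of $e^{(\alpha+i\beta)t}(p+iq)$ match the formulas for $X_1$ and $X_2$:

```python
import cmath
import math

# Assumed illustrative values; any alpha, beta, p, q would work.
alpha, beta = 1.0, 2.0
p = [1.0, 0.0]
q = [0.0, 1.0]

t = 0.3
z = cmath.exp((alpha + 1j * beta) * t)          # e^{(alpha + i beta) t}
complex_sol = [z * (p[i] + 1j * q[i]) for i in range(2)]

# The claimed real solution pair.
ea, c, s = math.exp(alpha * t), math.cos(beta * t), math.sin(beta * t)
X1 = [ea * (p[i] * c - q[i] * s) for i in range(2)]
X2 = [ea * (p[i] * s + q[i] * c) for i in range(2)]

# Real part matches X1, imaginary part matches X2.
assert all(abs(complex_sol[i].real - X1[i]) < 1e-12 for i in range(2))
assert all(abs(complex_sol[i].imag - X2[i]) < 1e-12 for i in range(2))
```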

Examples

Distinct Real Eigenvalues

Consider the initial value problem

$$X' = \begin{pmatrix} 3 & 1 \\ 0 & 2 \end{pmatrix}X, \qquad X(0) = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$

Since $A$ is upper triangular, its eigenvalues are the diagonal entries $\lambda_1=3$ and $\lambda_2=2$.

For $\lambda_1=3$, we solve $(A-3I)v=0$:

$$\begin{pmatrix} 0 & 1 \\ 0 & -1 \end{pmatrix}v = 0 \quad\Rightarrow\quad v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$

For $\lambda_2=2$, we solve $(A-2I)v=0$:

$$\begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}v = 0 \quad\Rightarrow\quad v_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

Thus the general solution is

$$X(t) = c_1 e^{3t}\begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{2t}\begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$

Applying the initial condition

$$X(0) = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$

and solving for $c_1$ and $c_2$ gives $c_2=-1$ and $c_1=2$, so

$$X(t) = \begin{pmatrix} 2e^{3t} - e^{2t} \\ e^{2t} \end{pmatrix}.$$
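It is easy to double-check this answer numerically. The short Python sketch below (our own verification, not part of the worked example) confirms that this $X(t)$ satisfies both the initial condition and $X'=AX$ at a sample time:

```python
import math

# The system matrix from the example.
A = [[3.0, 1.0],
     [0.0, 2.0]]

def X(t):
    # The claimed solution X(t) = (2e^{3t} - e^{2t}, e^{2t}).
    return [2 * math.exp(3 * t) - math.exp(2 * t), math.exp(2 * t)]

def X_prime(t):
    # Exact derivative, term by term.
    return [6 * math.exp(3 * t) - 2 * math.exp(2 * t), 2 * math.exp(2 * t)]

# Initial condition X(0) = (1, 1).
assert X(0) == [1.0, 1.0]

# X'(t) = A X(t) at a sample time.
t = 0.5
x = X(t)
Ax = [A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1]]
assert all(abs(X_prime(t)[i] - Ax[i]) < 1e-9 for i in range(2))
```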

Repeated Eigenvalues with Generalized Eigenvector

Consider the homogeneous system

$$X' = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}X.$$

The characteristic polynomial is $(2-\lambda)^2$, so $A$ has a repeated eigenvalue $\lambda=2$.

Solving $(A-2I)v=0$ gives

$$A - 2I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}v = 0 \ \Rightarrow\ v = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$

Note that since there is $p=1$ pivot, we have $n - p = 2 - 1 = 1$ linearly independent eigenvector.

To obtain a second independent solution, we find a generalized eigenvector ww by solving the linear system

$$(A-2I)w = v.$$

Here

$$A - 2I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad v = \begin{pmatrix} 1 \\ 0 \end{pmatrix},$$

so ww must satisfy

$$\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} w_1 \\ w_2 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$

From the first row we get $w_2=1$, and $w_1$ is free, so one convenient choice is

$$w = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$

Then two independent solutions are

$$X_1(t) = e^{2t}v, \qquad X_2(t) = e^{2t}\big(tv + w\big),$$

and therefore the general solution is

$$X(t) = c_1 X_1(t) + c_2 X_2(t) = c_1 e^{2t}v + c_2 e^{2t}(tv + w).$$
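We can also verify the generalized-eigenvector solution numerically. This Python sketch (again our own check) confirms that $X_2(t) = e^{2t}(tv + w)$ satisfies $X'=AX$ for the matrix above:

```python
import math

# The system matrix, eigenvector, and generalized eigenvector from the example.
A = [[2.0, 1.0],
     [0.0, 2.0]]
v = [1.0, 0.0]
w = [0.0, 1.0]

def X2(t):
    # X2(t) = e^{2t} (t v + w)
    e = math.exp(2 * t)
    return [e * (t * v[i] + w[i]) for i in range(2)]

def X2_prime(t):
    # Product rule: 2 * X2(t) + e^{2t} v
    e = math.exp(2 * t)
    x = X2(t)
    return [2 * x[i] + e * v[i] for i in range(2)]

t = 1.3
x = X2(t)
Ax = [A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1]]
assert all(abs(X2_prime(t)[i] - Ax[i]) < 1e-9 for i in range(2))
```

The key cancellation is that the extra $e^{2t}v$ term from the product rule is exactly matched by $(A-2I)$ applied to the $w$ part, because $(A-2I)w=v$.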

Repeated Eigenvalues with Two Eigenvectors

Consider

$$X' = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}X.$$

Here $\lambda=2$ has multiplicity $2$, and every nonzero vector is an eigenvector. In particular, two linearly independent eigenvectors are

$$v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$

Thus two independent solutions are $e^{2t}v_1$ and $e^{2t}v_2$, and the general solution is

$$X(t) = c_1 e^{2t}\begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{2t}\begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$

Complex Eigenvalues

Consider

$$X' = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}X.$$

The eigenvalues are $\lambda=\pm i$ (a complex conjugate pair). Using the formulas from the complex-eigenvalue case, we obtain two real solutions

$$X_1(t) = \begin{pmatrix} \cos t \\ \sin t \end{pmatrix}, \qquad X_2(t) = \begin{pmatrix} -\sin t \\ \cos t \end{pmatrix},$$

and therefore the general solution is

$$X(t) = c_1\begin{pmatrix} \cos t \\ \sin t \end{pmatrix} + c_2\begin{pmatrix} -\sin t \\ \cos t \end{pmatrix}.$$
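As one last sanity check, this short Python sketch (our own verification) confirms that both rotation solutions satisfy $X'=AX$ at a sample time:

```python
import math

# The rotation-type system matrix from the example.
A = [[0.0, -1.0],
     [1.0,  0.0]]

def check(X, X_prime, t):
    """Return True if X'(t) = A X(t) at time t."""
    x = X(t)
    Ax = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    dx = X_prime(t)
    return all(abs(dx[i] - Ax[i]) < 1e-12 for i in range(2))

# The two real solutions and their exact derivatives.
X1 = lambda t: [math.cos(t), math.sin(t)]
dX1 = lambda t: [-math.sin(t), math.cos(t)]
X2 = lambda t: [-math.sin(t), math.cos(t)]
dX2 = lambda t: [-math.cos(t), -math.sin(t)]

assert check(X1, dX1, 0.8) and check(X2, dX2, 0.8)
```

Note that the trajectories here are circles: the solutions rotate at unit angular speed without growing or decaying, consistent with $\alpha = 0$.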